Markov chain Monte Carlo sampling of a normal distribution from an exponential distribution
Author
Abstract
The following is a simple example to show two important properties of a Markov chain Monte Carlo (MCMC) sampler and to illustrate the basic functionality of the method and issues relating to its usage. A Markov chain is a series in which the realisation of the next element, Y, depends only on the current state, X, and occurs with probability P(Y|X). So the series of even numbers forms a Markov chain but the Fibonacci sequence does not; a particle trajectory is a Markov chain but the formation of a polymer is not. MCMC is a technique for sampling from a distribution that is difficult to sample from directly, either because it has no analytical form or because it is difficult to integrate, in which case the normalising constant, and hence the probability density function (pdf), cannot be found. The elements might be numerical results from Monte Carlo sampling within a multivariate system, which cannot be sampled directly but are generated from other sampled variables. MCMC enables independent sampling of a target distribution, f, by using samples from an integrable function, f*, in a similar fashion to importance sampling. The first important property of MCMC samplers is that the weighting function need not be known in advance but is effectively generated by a stochastic process. Samples are taken from the distribution f* and accepted or rejected depending on the relative probabilities of the new state compared to the old state in the target distribution, f, and the relative probabilities of selection in the integrable distribution, f*. Consequently, the second important property of MCMC samplers is that the probability of a state need never be known to better than a normalising constant. Hastings [1] introduces a general form for an MCMC sampler that is stationary, which requires that, given a sample from the target distribution f, subsequent samples remain in f. The probability of an arbitrary transition is given by P(Y|X) = Q(Y|X) α(X,Y), where Q(Y|X) is the proposal distribution and α(X,Y) is the probability of accepting the proposed state.
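As a rough illustration of the acceptance rule above, the Python sketch below runs an independence Metropolis-Hastings chain. It is not the paper's implementation: it assumes a standard normal target supplied only up to its normalising constant and, so that the whole real line can be reached, a symmetrised exponential (Laplace) proposal with rate lam; all function names and parameter choices are illustrative.

```python
import math
import random


def target_unnorm(x):
    """Unnormalised standard normal density, f*(x) proportional to exp(-x^2/2).
    Only the ratio f(Y)/f(X) enters the acceptance step, so the
    normalising constant 1/sqrt(2*pi) is never needed."""
    return math.exp(-0.5 * x * x)


def proposal_sample(lam=1.0):
    """Draw from a symmetrised exponential (Laplace) proposal:
    magnitude ~ Exp(lam), sign chosen uniformly at random."""
    mag = random.expovariate(lam)
    return mag if random.random() < 0.5 else -mag


def proposal_density(y, lam=1.0):
    """Density of the symmetrised exponential proposal, q(y) = (lam/2) exp(-lam |y|)."""
    return 0.5 * lam * math.exp(-lam * abs(y))


def metropolis_hastings(n_samples, x0=1.0, lam=1.0):
    """Independence Metropolis-Hastings chain targeting a standard normal
    (illustrative setup, not necessarily the configuration used in the paper)."""
    x = x0
    samples = []
    for _ in range(n_samples):
        y = proposal_sample(lam)
        # Hastings ratio for an independence proposal: f(Y) q(X) / (f(X) q(Y))
        ratio = (target_unnorm(y) * proposal_density(x, lam)) / (
            target_unnorm(x) * proposal_density(y, lam)
        )
        if random.random() < min(1.0, ratio):
            x = y  # accept the proposed state
        samples.append(x)  # on rejection the current state is repeated
    return samples


if __name__ == "__main__":
    chain = metropolis_hastings(50_000)
    burn = chain[5_000:]  # discard burn-in before summarising
    mean = sum(burn) / len(burn)
    var = sum((s - mean) ** 2 for s in burn) / len(burn)
    print(f"mean ~ {mean:.3f}, variance ~ {var:.3f}")  # expect roughly 0 and 1
```

Because only the ratio f(Y)/f(X) appears in the acceptance step, the target is passed in unnormalised, which is the second property noted above; the proposal is symmetrised here because a one-sided exponential proposal could never move the chain onto the negative half-line.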
Similar articles
Bayesian Estimation of Generalized Exponential Distribution under Progressive First Failure Censored Sample
In this paper, we consider the maximum likelihood (ML) and Bayes estimation of the parameters of the generalized exponential distribution based on progressive first failure censored samples. We also consider the problem of predicting independent future order statistics from the same distribution. However, since the Bayes estimators do not exist in explicit form for the parameters, Markov Chain...
An Improved Visible Normal Sampling Routine for the Beckmann Distribution
Recently, Heitz and D’Eon [2014] proposed a method for importance sampling the distribution of visible normals in the context of microfacet BSDF models. One of their sampling routines internally relies on a discontinuous mapping, which can cause problems in conjunction with Quasi Monte Carlo sampling and Markov Chain Monte Carlo integration. In this report, we develop an alternative method that...
Bayesian Prediction Based on Type-I Hybrid Censored Data from a General Class of Distributions
One- and two-sample Bayesian prediction intervals based on Type-I hybrid censored data for a general class of distributions, 1 - F(x) = [a h(x) + b]^c, are obtained. For the illustration of the developed results, the inverse Weibull distribution with two unknown parameters and the inverted exponential distribution are used as examples. Using the importance sampling technique and Markov Chain Monte Carlo (MCMC)...
Bayesian Nonparametric Reliability Analysis Using Dirichlet Process Mixture Model
Cheng, Nan, M.S., August 2011, Industrial and Systems Engineering. Bayesian Nonparametric Reliability Analysis Using Dirichlet Process Mixture Model. Director of Thesis: Tao Yuan. This thesis develops a Bayesian nonparametric method based on the Dirichlet Process Mixture Model (DPMM) and Markov chain Monte Carlo (MCMC) simulation algorithms to analyze non-repairable reliability lifetime data. Kernel d...
A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo...